copyFromLocal exception: DataNode cannot be started
copyFromLocal: File /user/apple/test.txt._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
This exception was reported as soon as the hdfs dfs -copyFromLocal command was executed: no DataNode was running, so HDFS had nowhere to place even a single replica of the file.
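A minimal diagnostic sketch for the "0 datanode(s) running" error above (assumes a Hadoop 2.x single-node setup; the data-directory path is an assumption — check dfs.datanode.data.dir in hdfs-site.xml):

```shell
# List the Hadoop daemons actually running; a healthy HDFS needs both
# a NameNode and at least one DataNode in this list.
jps

# Ask the NameNode how many live DataNodes it can see.
hdfs dfsadmin -report

# A common cause after reformatting the NameNode is a clusterID mismatch
# in the DataNode's storage directory. Clearing it and restarting the
# DataNode lets it re-register -- this DELETES local block data, so only
# do it on a scratch cluster.
rm -rf /tmp/hadoop-$USER/dfs/data
sbin/hadoop-daemon.sh start datanode
```

If dfsadmin -report then shows a live DataNode, the copyFromLocal should succeed.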
Error during the build: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-hdfs: An Ant BuildException has occurred: input file /usr/local/hadoop-2.6.0-stable/hadoop-2.6.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml
In the installation directory, execute hadoop jar hadoop-0.17.1-examples.jar wordcount <input path> <output path> to see the word-count statistics. Both the input and output paths here refer to paths in HDFS. You can therefore first create an input path in HDFS by copying a directory from the local file system into HDFS: hadoop dfs -copyFromLocal /home/wenchu/t
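Putting the steps above together, a full run might look like this (the local directory name is hypothetical; the jar name follows the 0.17.1 release in the text — on modern releases it is hadoop-mapreduce-examples-*.jar):

```shell
# Copy a local directory into HDFS to serve as the job input
# (/home/wenchu/input is a hypothetical local path).
hadoop dfs -copyFromLocal /home/wenchu/input input

# Run the bundled WordCount example; "input" and "output" are HDFS
# paths, and "output" must not already exist.
hadoop jar hadoop-0.17.1-examples.jar wordcount input output

# Inspect the word counts written by the reducers.
hadoop dfs -cat output/part-00000
```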
hadoop fs -rmr /user/mdss/ will delete the /user/mdss/ directory and its subdirectories.
Copying Files
To copy a file from the local file system into HDFS, use copyFromLocal:
hadoop fs -copyFromLocal Example.txt /user/mdss/example.txt
To copy a file from HDFS back to the local file system, use copyToLocal:
hadoop fs -copyToLocal /user/mdss/example.txt Example.txt
chmod
Usage: hadoop fs -chmod [-R] <MODE> URI [URI ...]
Change the permissions of files. With -R, make the change recursively through the directory structure. The user must be the owner of the file, or else a super-user. Additional information is in the HDFS admin guide on permissions.
chown
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Change the owner of files. With -R, make the change recursively through the directory structure. The user must be a super-user. Additional information is in the HDFS admin guide on permissions.
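As a concrete illustration of the chmod and chown usages above (the path and the hadoop user/group are assumptions):

```shell
# Recursively make everything under /user/hadoop readable by everyone
# but writable only by the owner.
hadoop fs -chmod -R 755 /user/hadoop

# Recursively assign the same tree to user "hadoop", group "hadoop";
# this must be run as an HDFS super-user.
hadoop fs -chown -R hadoop:hadoop /user/hadoop
```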
FS Shell
FS shell commands are invoked as bin/hadoop fs <args>.
Cat
Usage:
hadoop fs -cat URI [URI …]
Outputs the content of the specified files to stdout.
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/hadoop/file4
chgrp
Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]
Change the group association of files. With -R, make the change recursively through the directory structure. The user of the command must be the owner of the file or the super-user. For more information, see the HDFS Permissions User's Guide.
chown
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Change the owner of files. With -R, the change is made recursively through the directory structure. The user of the command must be a super-user. For more information, see the HDFS Permissions User's Guide.
FS Shell
cat
chgrp
chmod
chown
copyFromLocal
copyToLocal
cp
du
dus
expunge
get
getmerge
ls
lsr
mkdir
moveFromLocal
mv
put
rm
rmr
setrep
stat
tail
test
text
touchz
FS Shell: file system (FS) shell commands are invoked in the form bin/hadoop fs <args>.
Original address: http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_shell.html
When choosing which replica to read, HDFS follows these rules: data on the local rack is read in preference to remote racks.
Commands commonly used in HDFS
1. hadoop fs
hadoop fs -ls /
hadoop fs -lsr
hadoop fs -mkdir /user/hadoop
hadoop fs -put a.txt /user/hadoop
hadoop fs -get /user/
hwl@hadoop-master:~$ echo "Hello" > Hello.txt
hwl@hadoop-master:~$ sudo -u hdfs hadoop fs -mkdir /hwl
14/05/11 19:31:52 INFO security.UserGroupInformation: JAAS Configuration already set up for Hadoop, not re-installing.
hwl@hadoop-master:~$ sudo -u hdfs hadoop fs -
First, create the input directory in DFS.
hadoop@ubuntu:/usr/local/hadoop$ bin/hadoop dfs -mkdir input
Copy the files in conf to the input directory in DFS.
hadoop@ubuntu:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal conf/* input
copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Identical to the put command, except that the source path must be a local file.
copyToLocal
Usage: hadoop fs -copyToLocal [-ignoreCrc] [-crc] URI <localdst>
Identical to the get command, except that the destination path must be a local file.
cp
Usage: hadoop fs -cp URI [URI ...] <dest>
Copies files from the source path to the destination path.
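A short round trip using the three commands just described (all paths are hypothetical):

```shell
# Local -> HDFS: the source must be a local file.
hadoop fs -copyFromLocal notes.txt /user/hadoop/notes.txt

# HDFS -> HDFS: plain copy between HDFS paths.
hadoop fs -cp /user/hadoop/notes.txt /user/hadoop/notes-backup.txt

# HDFS -> local: the destination must be a local file.
hadoop fs -copyToLocal /user/hadoop/notes-backup.txt notes-backup.txt
```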
Copy the files from conf to the input directory in DFS:
hadoop@ubuntu:/usr/local/hadoop$ hadoop dfs -copyFromLocal conf/* input
Running WordCount in pseudo-distributed mode:
hadoop@ubuntu:/usr/local/hadoop$ hadoop
chmod
Usage: hadoop fs -chmod [-R] <MODE> URI [URI ...]
Change the permissions of files. With -R, the change is made recursively through the directory structure. The user of the command must be the owner of the file or the super-user. For more information, see the HDFS Permissions User's Guide.
chown
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Change the owner of files. With -R, the change is made recursively through the directory structure. The user of the command must be a super-user. For more information, see the HDFS Permissions User's Guide.